Revisiting the predictive power of kernel principal components

Authors

Abstract

In this short note, recent results on the predictive power of kernel principal components in a regression setting are extended in two ways: (1) in a model-free setting, we relax the conditional independence model assumption to obtain a stronger result; and (2) the results are extended to the infinite-dimensional setting.


Similar resources

Probabilistic Analysis of Kernel Principal Components

This paper presents a probabilistic analysis of kernel principal components by unifying the theory of probabilistic principal component analysis and kernel principal component analysis. It is shown that, while the kernel component enhances the nonlinear modeling power, the probabilistic structure offers (i) a mixture model for nonlinear data structures containing nonlinear sub-structures, and (i...


Kernel Principal Components Are Maximum Entropy Projections

Principal Component Analysis (PCA) is a very well known statistical tool. Kernel PCA is a nonlinear extension of PCA based on the kernel paradigm. In this paper we characterize the projections found by kernel PCA from an information-theoretic perspective. We prove that kernel PCA provides optimum entropy projections in the input space when the Gaussian kernel is used for the mapping and a sample...


Regaining sparsity in kernel principal components

Support Vector Machines are supervised regression and classification machines with the useful property of automatically identifying which data points are most important in forming the solution. Kernel Principal Component Analysis (KPCA) is a related technique in that it also relies on linear operations in a feature space, but it lacks this ability to identify important points. Sp...


Approximations of the standard principal components analysis and kernel PCA

Principal component analysis (PCA) is a powerful technique for extracting structure from possibly high-dimensional data sets, while kernel PCA (KPCA) is the application of PCA in a kernel-defined feature space. For standard PCA and KPCA, if the dataset is large, a very large amount of memory is needed to store the kernel matrix, and a lot of time is needed to calculate the eigenvalues and corresponding eigenvecto...
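The construction described above (PCA carried out in a kernel-defined feature space) can be sketched in a few lines of numpy. This is a minimal illustration, not code from any of the papers listed here; the RBF kernel and the `gamma` value are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project training points onto the top kernel principal components."""
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    # Double-center the kernel matrix, i.e. center the data in feature space.
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors so each column holds the projections onto one component.
    return vecs * np.sqrt(np.maximum(vals, 0.0))
```

Note that the eigendecomposition of the n-by-n kernel matrix is exactly the memory and time bottleneck the abstract above refers to, which motivates the approximation methods it studies.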


Efficiently updating and tracking the dominant kernel principal components

The dominant set of eigenvectors of the symmetric kernel Gram matrix is used in many important kernel methods in machine learning (e.g., kernel principal component analysis, feature approximation, denoising, compression, and prediction). Yet in the case of dynamic and/or large-scale data, the batch nature and computational demands of the eigenvector decomposition limit th...



Journal

Journal title: Statistics & Probability Letters

Year: 2021

ISSN: 1879-2103, 0167-7152

DOI: https://doi.org/10.1016/j.spl.2020.109019